Estimation Procedure for Reduced Rank Regression, PLSSVD

Authors

  • Willin Álvarez
  • Victor Griffin
Abstract

This paper presents a procedure for coefficient estimation in a multivariate reduced-rank regression model in the presence of multicollinearity. The procedure, denoted PLSSVD, permits prediction of the dependent variables by taking advantage of both Partial Least Squares (PLS) and Singular Value Decomposition (SVD). Global variability indices and prediction error sums are used to compare the results of classical reduced-rank regression (OLSSVD) with those of the PLSSVD procedure on examples with different degrees of multicollinearity (severe, moderate, and low). In addition, simulations comparing the methods were performed with different sample sizes under four scenarios. The new PLSSVD method is shown to be more effective when the multicollinearity is severe, especially for small sample sizes.
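The paper's PLSSVD algorithm is not reproduced in this abstract; as a point of reference, the classical OLSSVD baseline it compares against can be sketched as follows. The function name `olssvd` and the simulated data are illustrative assumptions, not taken from the paper: fit the full ordinary least squares coefficients, then reduce them to the requested rank via an SVD of the fitted values.

```python
import numpy as np

# Hypothetical sketch of the classical reduced-rank baseline (OLSSVD):
# full OLS fit followed by rank truncation via SVD. The paper's PLSSVD
# replaces the OLS step with PLS components; that step is not shown here.
def olssvd(X, Y, rank):
    # Full ordinary least squares solution B = (X'X)^{-1} X'Y
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    # SVD of the fitted values; the top right singular vectors span
    # the best rank-r subspace of the responses
    U, s, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    V_r = Vt[:rank].T
    # Project the OLS coefficients onto that subspace
    return B_ols @ V_r @ V_r.T

# Toy data with a true rank-1 coefficient matrix
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
B_true = rng.normal(size=(6, 1)) @ rng.normal(size=(1, 4))
Y = X @ B_true + 0.01 * rng.normal(size=(50, 4))

B_hat = olssvd(X, Y, rank=1)
print(np.linalg.matrix_rank(B_hat))  # 1
```

Under low noise the rank-1 estimate recovers the true coefficients closely; with severe multicollinearity in X, the OLS step becomes unstable, which is the situation where the paper reports PLSSVD performing better.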


Similar References

Biplots in Reduced-Rank Regression

Regression problems with a number of related response variables are typically analyzed by separate multiple regressions. This paper shows how these regressions can be visualized jointly in a biplot based on reduced-rank regression. Reduced-rank regression combines multiple regression and principal components analysis and can therefore be carried out with standard statistical packages....


On the Estimation of Reduced Rank Regressions

It is well-known that estimation by reduced rank regression is given by the solution to a generalized eigenvalue problem. This paper presents a new proof to establish this result and provides additional insight into the structure of the estimation problem. The proof is a direct algebraic proof that some might find more intuitive than existing proofs. JEL Classification: C3, C32
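The generalized eigenvalue formulation mentioned above can be illustrated with a small sketch. The variable names and the Cholesky-whitening reduction below are illustrative assumptions: the problem S_yx S_xx^{-1} S_xy v = λ S_yy v is converted to an ordinary symmetric eigenproblem, whose eigenvalues are the squared sample canonical correlations and therefore lie in [0, 1].

```python
import numpy as np

# Toy data with a rank-2 signal (illustrative, not from the paper)
rng = np.random.default_rng(1)
n, p, q = 200, 5, 4
X = rng.normal(size=(n, p))
Y = X @ rng.normal(size=(p, 2)) @ rng.normal(size=(2, q)) + rng.normal(size=(n, q))

# Sample moment matrices (hypothetical notation)
Sxx = X.T @ X / n
Syy = Y.T @ Y / n
Sxy = X.T @ Y / n

# Generalized eigenproblem  Syx Sxx^{-1} Sxy v = lam * Syy v,
# reduced to an ordinary symmetric one via Cholesky whitening of Syy:
# eigenvalues of L^{-1} A L^{-T} with Syy = L L'
L = np.linalg.cholesky(Syy)
A = Sxy.T @ np.linalg.solve(Sxx, Sxy)              # Syx Sxx^{-1} Sxy
M = np.linalg.solve(L, np.linalg.solve(L, A).T).T  # L^{-1} A L^{-T}
lam, W = np.linalg.eigh(M)

# Eigenvalues are squared canonical correlations, hence in [0, 1]
print(bool(np.all((lam > -1e-8) & (lam < 1 + 1e-8))))  # True
```

The reduced-rank estimate is then built from the eigenvectors associated with the largest eigenvalues; the blurb's cited paper concerns proving this characterization, not computing it.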


Maximum Likelihood Estimation of the I(2) Model under Linear Restrictions

Estimation of the I(2) cointegrated vector autoregressive (CVAR) model is considered. Without further restrictions, estimation of the I(1) model is by reduced-rank regression (Anderson (1951)). Maximum likelihood estimation of I(2) models, on the other hand, always requires iteration. This paper presents a new triangular representation of the I(2) model. This is the basis for a new estimation p...


Scalable Reduced-rank and Sparse Regression

Reduced-rank regression, i.e., multi-task regression subject to a low-rank constraint, is an effective approach to reduce the number of observations required for estimation consistency. However, it is still possible for the estimated singular vectors to be inconsistent in high dimensions as the number of predictors goes to infinity at a faster rate than the number of available observations. Spa...


Multivariate reduced rank regression in non-Gaussian contexts, using copulas

We propose a new procedure to perform Reduced Rank Regression (RRR) in non-Gaussian contexts, based on Multivariate Dispersion Models. Reduced-Rank Multivariate Dispersion Models (RR-MDM) generalise RRR to a very large class of distributions, which include continuous distributions like the normal, Gamma, Inverse Gaussian, and discrete distributions like the Poisson and the binomial. A multivaria...




Publication date: 2016